100 research outputs found

    On Quantum Effects in a Theory of Biological Evolution

    We construct a descriptive toy model that considers quantum effects on biological evolution, starting from Chaitin's classical framework. There are smart evolution scenarios in which a quantum world is as favorable as a classical world for evolution to take place. In more natural scenarios, however, the rate of evolution depends on the degree of entanglement present in quantum organisms relative to classical organisms. If the entanglement is maximal, classical evolution turns out to be more favorable.
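
    The classical side of this setup can be caricatured in a few lines. In Chaitin's metabiology, an organism is a program and its fitness is the number it computes; the sketch below is a drastically simplified stand-in in which the organism is just an integer genome, keeping the mutate-and-select dynamics while dropping everything uncomputable (all names and parameters are illustrative, not from the paper):

```python
import random

def evolve(generations=10_000, genome_bits=16, seed=0):
    """Hill-climbing caricature of Chaitin-style evolution: the organism
    is an integer and its fitness is simply its value. (In Chaitin's
    actual framework, fitness is the number computed by the
    organism-as-program, which makes it uncomputably hard to assess.)"""
    random.seed(seed)
    organism = 0
    for _ in range(generations):
        mutant = organism ^ (1 << random.randrange(genome_bits))  # flip one random bit
        if mutant > organism:  # selection: keep the mutant only if it is fitter
            organism = mutant
    return organism

print(evolve())  # climbs toward 2**16 - 1 as beneficial mutations accumulate
```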

    Algorithmic statistics revisited

    The mission of statistics is to provide adequate statistical hypotheses (models) for observed data. But what is an "adequate" model? To answer this question, one needs the notions of algorithmic information theory. It turns out that for every data string x one can naturally define a "stochasticity profile", a curve that represents a trade-off between the complexity of a model and its adequacy. This curve has four equivalent definitions, in terms of (1) randomness deficiency, (2) minimal description length, (3) position in the lists of simple strings, and (4) Kolmogorov complexity with decompression time bounded by the busy beaver function. We present a survey of the corresponding definitions and results relating them to each other.
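
    Kolmogorov complexity is uncomputable, but an off-the-shelf compressor gives a crude upper bound, which is enough to illustrate the two extreme points of a stochasticity profile. The snippet below is a hedged sketch using zlib as a stand-in for K; it is not any construction from the paper:

```python
import os
import zlib

def K(x: bytes) -> int:
    """Crude upper bound on Kolmogorov complexity, in bits, via zlib."""
    return 8 * len(zlib.compress(x, 9))

x_regular = b"ab" * 500       # structured: a tiny model explains it
x_random = os.urandom(1000)   # incompressible: only the singleton model {x} fits well

# Two endpoints of the profile: the model "all 1000-byte strings" is simple
# but leaves x_regular with a huge randomness deficiency (roughly 8000 - K(x)
# bits), while the singleton model {x} has zero deficiency at model cost ~K(x).
for name, x in [("regular", x_regular), ("random", x_random)]:
    print(f"{name}: K(x) <= ~{K(x)} of {8 * len(x)} bits")
```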

    The Road to Quantum Computational Supremacy

    We present an idiosyncratic view of the race for quantum computational supremacy. Google's approach and IBM's challenge are examined. An unexpected side effect of the race has been significant progress in designing fast classical algorithms. Quantum supremacy, if achieved, will not make classical computing obsolete.
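
    The "fast classical algorithms" in question simulate quantum circuits on ordinary hardware. The sketch below shows the brute-force baseline, a dense state-vector simulation of a two-qubit Bell circuit, whose 2^n memory cost is exactly what cleverer classical methods try to beat; it is illustrative code, not any group's actual simulator:

```python
import numpy as np

def apply_gate(state, gate, target, n):
    """Apply a single-qubit gate to qubit `target` of an n-qubit state vector."""
    state = state.reshape([2] * n)
    state = np.tensordot(gate, state, axes=([1], [target]))
    state = np.moveaxis(state, 0, target)
    return state.reshape(-1)

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate

n = 2
state = np.zeros(2 ** n, dtype=complex)
state[0] = 1.0                      # start in |00>
state = apply_gate(state, H, 0, n)  # Hadamard on qubit 0

# CNOT (control qubit 0, target qubit 1) as a full 4x4 matrix: swaps |10> and |11>
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)
state = CNOT @ state
print(np.round(state, 3))  # Bell state: amplitude 1/sqrt(2) on |00> and |11>
```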

    Is Evolution Algorithmic?

    In Darwin’s Dangerous Idea, Daniel Dennett claims that evolution is algorithmic. On Dennett’s analysis, evolutionary processes are trivially algorithmic, because he assumes that all natural processes are algorithmic. I will argue that there are more robust ways to understand algorithmic processes, ways that make the claim that evolution is algorithmic empirical rather than conceptual. While laws of nature can be seen as compression algorithms for information about the world, it does not follow logically that they are implemented as algorithms by physical processes. For that to be true, the processes would have to be part of computational systems. The basic difference between mere simulation and real computing is having the proper causal structure. I will show what kind of requirements this poses for natural evolutionary processes if they are to be computational.
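
    The compression reading of laws can be made concrete: data generated by a simple rule compresses far better than lawless noise, although, as the abstract stresses, this by itself says nothing about whether the physical process computes. A small illustrative sketch:

```python
import os
import zlib

# "A law compresses observations": a deterministic rule makes its output
# highly compressible, while noise stays roughly its own length. Nothing
# here implies the generating process itself runs an algorithm.
lawful = bytes((3 * t * t + 7 * t) % 256 for t in range(10_000))  # simple "law"
noise = os.urandom(10_000)                                        # lawless data

print("lawful:", len(zlib.compress(lawful, 9)), "bytes")  # small: the rule explains it
print("noise: ", len(zlib.compress(noise, 9)), "bytes")   # ~10,000: incompressible
```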

    Media Ontology and Transcendental Instrumentality

    This article takes inspiration from Kittler’s claim that philosophy has neglected the means used for its own production. Kittler’s argument for an ontology of media invites us to reflect upon the cybernetic mechanization of logic, which has led practical or instrumental knowledge to challenge the classical division between theory and practice, ideas and demonstrations. The article suggests that procedures, tasks, and functions are part of an instrumental thinking. Drawing on information theory and the mathematical logic of constructivism, it addresses indeterminacy within automated logic and proposes a rehabilitation of instrumentality whereby the connection between means and ends is articulated away from classical idealism and analytic realism. Following John Dewey’s argument for instrumental reasoning, the article suggests that the post-Kantian critique of techne should be revisited in order to account for a machine philosophy that has originated from within the practical thinking of machines.

    Depth, Highness and DNR Degrees

    A sequence is Bennett deep [5] if, for every recursive approximation from above of the Kolmogorov complexity of its initial segments, the difference between the approximation and the actual value of the initial segment complexity dominates every constant function. We study, for different lower bounds r on this difference between approximation and actual value, which properties the corresponding r(n)-deep sets have. We prove that for r(n) = εn, depth coincides with highness on the Turing degrees. For smaller choices of r, i.e., for any recursive order function r, we show that depth implies either highness or diagonal non-recursiveness (DNR). In particular, for left-r.e. sets, order depth already implies highness. As a corollary, we obtain that weakly useful sets are either high or DNR. We prove that not all deep sets are high by constructing a low order-deep set. Bennett's depth is defined using prefix-free Kolmogorov complexity. We show that if one replaces prefix-free with plain Kolmogorov complexity in Bennett's definition, one obtains a notion which no longer satisfies the slow growth law (which stipulates that no shallow set truth-table computes a deep set); however, under this notion, random sets are not deep (at the unbounded recursive order magnitude). We improve Bennett's result that recursive sets are shallow by proving that all K-trivial sets are shallow; our result is close to optimal. For Bennett's depth, the magnitude of compression improvement has to be achieved almost everywhere on the set. Bennett observed that relaxing this to infinitely often is meaningless, because every recursive set is infinitely often deep. We propose an alternative infinitely-often depth notion (i.o. depth) that does not suffer from this limitation. We show that every hyperimmune degree contains an i.o. deep set of magnitude εn, and we construct a Π01 class in which every member is an i.o. deep set of magnitude εn. We prove that every non-recursive, non-DNR, hyperimmune-free set is i.o. deep of constant magnitude, and that every non-recursive many-one degree contains such a set.
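
    Bennett's depth is uncomputable, but its intuition (deep objects take a long time to regenerate from near-shortest descriptions, while both random and trivial strings are shallow) can be caricatured with an off-the-shelf compressor. The proxy below is only an illustration of that intuition, not any notion used in the paper:

```python
import os
import time
import zlib

def depth_proxy(x: bytes, trials: int = 200) -> float:
    """Crude stand-in for logical depth: average time to regenerate x
    from its zlib-compressed description. Real depth uses optimal
    prefix-free descriptions and is uncomputable."""
    compressed = zlib.compress(x, 9)
    t0 = time.perf_counter()
    for _ in range(trials):
        zlib.decompress(compressed)
    return (time.perf_counter() - t0) / trials

# Both extremes are shallow: random data has no structure to unfold,
# and trivially regular data unfolds instantly. Genuinely deep strings
# (short programs with long computations) would score higher.
print(depth_proxy(os.urandom(100_000)))   # random: shallow
print(depth_proxy(b"a" * 100_000))        # trivial: shallow
```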

    Integrated information increases with fitness in the evolution of animats

    One of the hallmarks of biological organisms is their ability to integrate disparate information sources to optimize their behavior in complex environments. How this capability can be quantified and related to the functional complexity of an organism remains a challenging problem, in particular since organismal functional complexity is not well defined. We present several candidate measures that quantify information and integration, and study their dependence on fitness as an artificial agent ("animat") evolves over thousands of generations to solve a navigation task in a simple, simulated environment. We compare the ability of these measures to predict high fitness with that of more conventional information-theoretic processing measures. As the animat adapts by increasing its "fit" to the world, information integration and processing increase commensurately along the evolutionary line of descent. We suggest that the correlation of fitness with information integration and with processing measures implies that high fitness requires both information processing and integration, but that information integration may be the better measure when the task requires memory. The correlation of measures of information integration (and also of information processing) with fitness strongly suggests that these measures reflect the functional complexity of the animat, and that such measures can be used to quantify functional complexity even in the absence of fitness data.
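
    As a toy illustration of "integration" across a bipartition, one can measure the mutual information between the two halves of a small binary system sampled over time. This is a deliberately simplified proxy, not the Phi-style measures the paper studies, which involve minimization over partitions and causal perturbation of the network:

```python
import numpy as np
from collections import Counter

def mutual_information(states, cut):
    """I(A;B) in bits across a bipartition of binary node states.
    `states` is a list of tuples of 0/1; `cut` splits each tuple."""
    a = [s[:cut] for s in states]
    b = [s[cut:] for s in states]
    def H(xs):
        p = np.array(list(Counter(xs).values())) / len(xs)
        return -(p * np.log2(p)).sum()
    return H(a) + H(b) - H(states)

rng = np.random.default_rng(0)
shared = rng.integers(0, 2, 1000)
integrated = [tuple([v] * 4) for v in shared]                     # all 4 nodes copy one bit
independent = [tuple(rng.integers(0, 2, 4)) for _ in range(1000)]  # nodes are independent

print(mutual_information(integrated, 2))   # ~1 bit: the halves share information
print(mutual_information(independent, 2))  # ~0 bits: no integration across the cut
```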

    A formally verified compiler back-end

    This article describes the development and formal verification (a proof of semantic preservation) of a compiler back-end from Cminor (a simple imperative intermediate language) to PowerPC assembly code, using the Coq proof assistant both for programming the compiler and for proving its correctness. Such a verified compiler is useful in the context of formal methods applied to the certification of critical software: the verification of the compiler guarantees that the safety properties proved on the source code hold for the compiled executable as well.
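
    Semantic preservation, the theorem this work proves, can be illustrated in miniature: compile arithmetic expressions to a stack machine and check that running the compiled code agrees with the source semantics on every input. The sketch below merely spot-checks by random testing what the verified compiler establishes in Coq once and for all:

```python
import random

def eval_expr(e):
    """Source semantics: e is a nested tuple AST, ('lit', n) or (op, l, r)."""
    if e[0] == "lit":
        return e[1]
    l, r = eval_expr(e[1]), eval_expr(e[2])
    return l + r if e[0] == "+" else l * r

def compile_expr(e):
    """Compile to postfix stack-machine code: PUSH n / ADD / MUL."""
    if e[0] == "lit":
        return [("PUSH", e[1])]
    op = ("ADD" if e[0] == "+" else "MUL",)
    return compile_expr(e[1]) + compile_expr(e[2]) + [op]

def run(code):
    """Target semantics: execute the stack machine."""
    stack = []
    for instr in code:
        if instr[0] == "PUSH":
            stack.append(instr[1])
        else:
            r, l = stack.pop(), stack.pop()
            stack.append(l + r if instr[0] == "ADD" else l * r)
    return stack[-1]

def random_expr(depth):
    if depth == 0:
        return ("lit", random.randint(0, 9))
    return (random.choice("+*"), random_expr(depth - 1), random_expr(depth - 1))

for _ in range(1000):  # semantic preservation, tested here rather than proved
    e = random_expr(4)
    assert eval_expr(e) == run(compile_expr(e))
print("all compiled programs agreed with source semantics")
```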
    • …